=======================================================================
Testimony before the House Subcommittee on Science
Concerning the Security of the Internet
March 22, 1994
Stephen D. Crocker
Trusted Information Systems
3060 Washington Road
Glenwood, MD 21738
301-854-6889
crocker@tis.com
INTRODUCTION
My name is Stephen Crocker. I am a vice president of Trusted
Information Systems, a small Maryland company specializing in computer
and network security research, consulting, and products, with branch
offices in California and the United Kingdom. For the past four
years I have served as the Area Director for Security in the Internet
Engineering Task Force (IETF), and it is in this capacity that I
appear here today.
The IETF is the technical group developing and standardizing the
protocols used on the Internet. Protocols are agreements on the
format and sequencing of messages sent over the network. As Area
Director for Security, I oversee the work on security protocols and on
the security aspects of the other IETF standards activities. Next
week I will complete my term and begin a two-year term on the
Internet Architecture Board (IAB). Jeff Schiller of MIT will be the
next Area Director for Security, and I have shared these remarks with
him.
Your subcommittee is investigating the causes of the recent security
incidents on the Internet and seeking to understand what might be done
to prevent similar incidents in the future. This is important in its
own right since there are an estimated ten to twenty million people
using the Internet around the world, but it is even more important
because the Internet is both a model for and a key building block of the
National Information Infrastructure (NII).
Before addressing these questions directly, I'd like to review the
history of the Internet with particular attention to the design of the
security protection.
HISTORY, PART I: THE ARPANET
The current Internet grew out of the original Arpanet, which was
initiated in 1969. Vinton Cerf, who's also testifying today, and I
had the honor of connecting the first host computer to the first
Interface Message Processor (IMP) at UCLA, the first site on the
Arpanet. At that time, ARPA planned to build a network consisting of
four nodes and then to expand it if the experiment was successful.
The original design permitted about sixty sites to be part of the
network, with no more than four computers per site. Although that
seems small today, it was far larger than any previous attempt to
connect distant computers together.
ARPA had planned in detail how to connect the computers together and
how to move information between them. What they did not plan very
completely was what the computers should say to each other. They did,
however, have the foresight to install the Arpanet into the sites they
were supporting for advanced research and development in information
processing. As might be expected, the research community went right
to work to figure out what to do with this new capability.
An ad hoc Network Working Group was formed by the network sites, and its
job was to define the protocols of what the computers should say to each
other. We drew from the analogies of how we used individual computers.
In those days, the computers in common use in the research community
were time-shared systems. Between a dozen and several dozen people
would use a computer at the same time. Personal computers and
workstations had not yet been invented.
The intensive use of time-shared computers had already introduced us to
computer security problems. Quite a lot of research and development had
already taken place, and there were controls in place on almost all
computers to protect one user from being interfered with by another
user. The first part of that protection was based on passwords. Each
user had to identify himself and then type a secret password before he
was able to use the computer. Once a user was "logged in," there were
additional controls internally on what each user could do.
These controls were not perfect, but they were moderately satisfactory.
If a user chose a poor password, another user might try to guess it and
thereby impersonate him. Abuses of this form occurred, but they usually
occurred within limited communities.
During the early stages of the Arpanet, we took these time-shared
computers and connected them to each other. We of course recognized
that security was important, but we envisioned the threat to be the same
as on individual time-shared computers. Accordingly, our first
protocols provided the equivalent of a dial-up connection from one
computer to another. When a user connected over the Arpanet from her
local machine to a remote machine, the user had to identify herself to
the remote computer. This meant the user had to have an account on the
remote computer and she had to type her password for that account.
This scheme worked reasonably well for a while, but gradually the
network grew very large, and with that growth came new problems.
HISTORY, PART II: THE INTERNET
During the 1970s, several events caused the network to become much,
much larger. First, the Ethernet was invented. This made it possible
to connect dozens of computers together in a building or on a campus.
The Ethernet was an example of what is now called a "local area
network" or LAN. Other technologies were also developed for LANs, but
the Ethernet was by far the most successful and caused networking to
explode.
Local networks would have no place in our story except that they were
also connected to the Arpanet. Where previously only four computers
at one site could be part of the Arpanet, it was now possible to
connect several dozen computers at one site to the national network.
It also became clear that one common wide area network would not
suffice. The Arpanet could accommodate only 63 sites, and there was a
need to connect hundreds or thousands of sites. Many new networks
were built. The appearance of the Arpanet in the U.S. also triggered
an urgency in other countries to build their own networks. Canada,
the United Kingdom and France were among the first. Also, different
organizations built their own networks. Some of these were private
networks inside corporations, universities or government agencies, and
others were built to serve a broader research community. The National
Science Foundation, the Department of Energy, and NASA all moved
aggressively into this arena to the benefit of their communities and
the country as a whole.
Almost all of these networks, local and national, needed to be
connected together. The purpose of a network is to provide
connectivity for a large number of sites. Separate, disconnected
networks were as awkward as the separate telephone networks in the
early part of this century.
Out of this need emerged a vision of how multiple networks could be
connected together without requiring any single network to be in
control. Robert Kahn and Vinton Cerf designed a new set of protocols,
TCP and IP. These protocols make it possible for a wide range of
networks to be connected to each other, and for a computer on one
network to communicate with a computer on any other network.
Thus was born the Internet, which is not a single network, but an
open-ended cross connection of local area networks and wide area
networks, each separately administered, but all adhering to the same
basic protocols. Today, the Internet consists of thousands of
networks, hundreds of thousands of computers, and an estimated ten to
twenty million users. The Internet is operational in a majority of
countries. In the U.S., the Internet connects virtually all research
universities, many Fortune 500 companies, and almost all of the
information processing industry, and the Internet is now spreading to
the K through 12 community.
SECURITY IN THE EARLY INTERNET -- THE RISE IN BREAKINS
With this rapid increase in the size of the Internet, the security
issues also changed. A small percentage of the users explored the
Internet with the aim of breaking into systems. For the most part,
these were youngsters, their interest was thrill seeking, and they
rarely caused any damage. Occasionally the purpose was more sinister,
for example to do harm for revenge or to exploit a system for personal
gain.
Another consequence of the very rapid increase in the size of the
network is a very uneven level of security awareness. The two most
common weaknesses in computers connected to the Internet are
configuration errors and poorly chosen passwords. A typical
configuration error is for a computer to have an account which does
not require a password. Examples of poorly chosen passwords are
someone's name or an English word.
The mixture of idle youngsters looking for thrills and thousands of
poorly administered computers was the perfect chemistry for trouble.
Breakins occurred at numerous sites. Most of these were innocuous,
but nonetheless disturbing, and occasionally these skills were used
for more troublesome purposes. It was not uncommon for the
penetrators to attempt to erase evidence of their penetrations, and it
was also not uncommon for penetrators to leave behind trap doors to
facilitate surreptitious access at a later time.
Local administrators struggled with these breakins. When the network
was small, most of the administrators knew each other and could
cooperate easily. As the network grew larger, it was not as easy to
maintain such contact. Also, it was difficult to gain much attention
from law enforcement agencies because they had little experience with
network technology, the laws were not completely clear on the status of
various acts, and the jurisdictions were sometimes undefined.
The most readable and vivid account of the interplay of these
different effects was given by Clifford Stoll in his book "The
Cuckoo's Egg." Clifford Stoll, an astronomer and computer scientist
at Lawrence Berkeley Laboratory (LBL), was pressed into service as a
computer administrator. He discovered a small discrepancy in the
accounting system reports. What ensued was a remarkable tale of
persistence on both his part and the part of the penetrators. He
discovered that the discrepancies had come from the not quite perfect
attempt of a penetrator to cover his tracks by erasing all evidence
that an unauthorized user had consumed computer time. Unbeknownst to
the penetrator, LBL was using two accounting systems, one provided by
the vendor, and one the LBL staff had built and installed themselves.
The penetrator was unaware of the LBL accounting system, and thus left
behind one small clue.
One of the fruits of Dr. Stoll's work was much better rapport with the
law enforcement community. I doubt anyone would suggest the FBI,
Secret Service and various local police agencies are yet completely
equipped to deal with network penetrations, but their skill and
experience and, perhaps more importantly, their attention to this type
of problem are certainly improving.
THE MORRIS WORM INCIDENT
Most of the breakins across the Internet were carried out by people
with modest skill and much patience. They often wrote programs to
help them break into computers, but the programs tended to be
relatively simple.
However, in 1988, Robert Morris Jr., then a graduate student at
Cornell University, demonstrated that one could break into a large
number of computers in an automated way. He wrote a program which
exploited poorly chosen passwords and configuration errors, and which
also exploited flaws in the software delivered by the vendor. The
most striking feature of Morris' software is that once it broke into
one computer, it would use that computer as a platform to stage
breakins at other computers. Since it is common for groups of
computers to be operated with mutual trust, it was sometimes easier to
break into a computer from a nearby mutually trusted computer than it
would have been to break in from afar. Because it propagated itself from
one computer to the next in this way, Morris' program is known as a
"worm."
Within hours after his software was released, it had traveled to
several thousand computers. Although his intent had been for the worm
to exist unnoticed for a long time, it was quickly discovered and
eradicated. Researchers and computer system administrators at MIT,
University of California, Berkeley and other sites worked around the
clock to understand and counter the worm, and it was brought under
control within a day or two.
The incident brought the Internet community together almost overnight,
and ARPA established the Computer Emergency Response Team (CERT) in
Pittsburgh to collect information on security incidents and to assist
in coordinating responses to future incidents. Dain Gary, the head of
the CERT,
is also testifying today and can provide a picture of how active the
CERT has been since its establishment five years ago.
The Morris worm caused vendors to review their software for security
weaknesses, and to respond quickly and vigorously to reports of flaws
in their software. Sun Microsystems, Digital Equipment Corporation,
and others set up special procedures for fixing security-relevant
flaws and releasing those fixes to the network community. The CERT
has worked with vendors to create these channels. These channels have
been helpful, but I believe there needs to be even closer cooperation
between the users and the vendors.
THE NEW THREAT: PASSWORD SNIFFING
The rash of incidents which is the subject of this hearing is the
widespread appearance of "sniffing" programs which capture passwords while
they are in transit across the Internet. These programs make use of
the fact that a computer on an Ethernet -- and most other kinds of
local area networks -- can choose to read all of the messages on that
network, even if they are not addressed to that particular computer.
Usually a computer only examines messages addressed to that computer,
but the penetrators can insert sniffing programs in a variety of
computers across the network. Using these programs, they look for
passwords, and then make use of those passwords to break into other
computers. It is estimated that several thousand passwords have been
compromised.
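A minimal present-day sketch of the underlying mechanism, assuming Python
on a Linux machine whose interface has been placed in promiscuous mode
(every name and value below is an illustrative assumption), might look
like:

    # Illustration of the mechanism only: a raw socket delivers frames
    # even when they are addressed to some other computer on the segment,
    # which is why cleartext passwords crossing the wire are exposed.
    # Requires root privileges and, to see other hosts' traffic,
    # promiscuous mode (e.g. "ip link set dev eth0 promisc on").
    import socket

    ETH_P_ALL = 0x0003                       # ask for every protocol type
    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW,
                      socket.ntohs(ETH_P_ALL))

    for _ in range(10):
        frame, meta = s.recvfrom(65535)
        dest = frame[0:6].hex(":")           # destination hardware address
        src = frame[6:12].hex(":")           # source hardware address
        print(f"frame from {src} to {dest}, {len(frame)} bytes")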
This threat is serious and needs to be countered. It is no longer
appropriate to transmit passwords in the clear across the Internet, or
indeed across any local area network, and security measures must
evolve to provide stronger protection. Let me emphasize that this
threat is not limited to the Internet. The same threat exists in
every local area network, whether or not it is connected to the
Internet.
It should also be understood that this set of incidents tells us that
protecting passwords is not enough. The same sniffer technology makes
it possible to simply record "interesting" messages. Even if we
improve the protection of passwords, we will shortly need to protect
all of the traffic on networks.
Part of the work necessary to counter this threat has already been
done, but additional work is needed. Before detailing the specific
measures needed to thwart this kind of attack, I want to give a little
bit of background on the Internet Engineering Task Force, where
technical developments of Internet protocols are carried out.
THE INTERNET ENGINEERING TASK FORCE
Computers on the Internet communicate using standard protocols, which
are agreements on the format and sequencing of messages sent over the
network. I mentioned the TCP and IP protocols earlier. These are the
core protocols that knit the Internet together, but they are by no
means the only protocols. Actually dozens of protocols are in daily
use on the Internet. They govern the way electronic mail is
transmitted, the way a user on one computer logs in remotely to
another computer, the way files are transferred, the way information
is published through "gopher" and the World Wide Web. There are a
large number of protocols, and more are being created every year. The
Internet is a vibrant and active system, and it is constantly evolving
to provide new services and to operate in new environments.
Each protocol represents a substantial technical effort, involving
groups of people across several organizations. Once the technical
specification for a protocol is complete, companies and other
organizations implement it. These efforts represent the investment of
millions of dollars of development and risk each year.
Because protocols govern the way computers interact with each other
across the Internet and affect virtually all computer systems, they
must be developed in an open, fair and efficient environment. The
Internet Engineering Task Force (IETF) provides that forum. At any
given time there are usually sixty to eighty working groups in the
IETF, each developing a specific protocol. At the present time, we
have working groups focused on new applications such as video
conferencing over the Internet, on the use of higher speed media
including gigabit technology, and on broadening the Internet to
potentially serve several billion people worldwide.
The IETF is open to anyone. Its work is carried out by electronic
mail and through regular meetings. Everyone is welcome to join the
mailing lists or to attend the working meetings. Decisions are made
by consensus, and care is taken to make sure that everyone has
an opportunity to be heard.
To facilitate and manage the IETF, the working groups are divided into
several areas. A steering group which consists of the area directors
and the chair of the IETF oversees all of the working groups. One of
these areas is Security, and its purpose is to develop protocols which
improve the security of data networking. As is obvious from our
appearance here today, there is more work to be done.
SECURITY PROTOCOLS
As I outlined above, when the Arpanet was first built, we took security
seriously and imposed the same controls across the network that were
already being employed within each computer. It is therefore quite
commonplace for a user to have to identify herself when she wants to use
a remote computer, and for her to prove that her identity is accurate by
typing a password. As we have also seen, such measures are not
completely adequate, partly because passwords are often poorly chosen,
and in light of the most recent events, it is also evident that
passwords may be discovered by sophisticated eavesdropping.
For approximately twenty years there has been active research to
develop stronger forms of protection in networks, and over the last
ten years, that work has focused on developing specific protocols to
enhance security in the Internet. Protocols have been designed to add
privacy and authenticity to electronic mail, to facilitate remote use
of facilities without passing passwords, to protect control messages
used to manage remote devices, and to serve other purposes.
These protocols all require the use of cryptography. In some cases the
cryptography is used to protect messages against tampering or to provide
assurance that the sender is who he says he is. In other cases the
cryptography is used to protect the contents of a message from being
seen while it is being sent across the Internet.
Cryptography is not a simple subject, and it consists of a variety of
techniques. The most common technique is scrambling a message
according to a secret code. If the person receiving the message also
knows the code, then she can unscramble it. In technical parlance,
this is known as a symmetric key cipher because the same code, or
"key", is used to scramble and unscramble the message.
In the 1970s and 1980s, there was a remarkable breakthrough in
cryptography with the invention of asymmetric, or "public key"
cryptography. What's different about asymmetric cryptography is that
the person scrambling the message and the person unscrambling the
message use different, but related, keys, and the person with one key
cannot figure out what the other key is. This makes it possible for one
of the keys, say the scrambling key, to be advertised widely, without
disclosing what the corresponding unscrambling key is.
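A companion sketch of the asymmetric idea, again assuming Python and the
"cryptography" package purely for illustration, might look like:

    # Asymmetric ("public key") cipher: the scrambling key can be
    # advertised widely, and knowing it does not reveal the related
    # unscrambling key.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    private_key = rsa.generate_private_key(public_exponent=65537,
                                            key_size=2048)
    public_key = private_key.public_key()        # safe to publish

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    scrambled = public_key.encrypt(b"meet at noon", oaep)
    print(private_key.decrypt(scrambled, oaep))  # b'meet at noon'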
Asymmetric cryptography is important because it is extremely well
matched to the needs of computer networks. It makes it possible to
use cryptography routinely on a very wide scale.
Research in cryptography has been an important stimulus for developing
better security protection in networks. Several security protocols
have been developed for the Internet which employ cryptographic
techniques. However, these protocols have not yet been very widely
implemented or deployed. I think the prospects for using these
stronger protocols are good, but the path is not completely clear, and
I want to briefly examine the reasons.
IMPEDIMENTS
There are two reasons why cryptography is not more widely used in the
Internet today. One is ease of use; the other is government policy.
Until recently, the most common form of direct interaction over the
Internet has been logging into a remote computer. As noted before,
the same protection is employed for these remote connections as is
used locally, viz., passwords. Passwords are still reasonably useful in
a controlled environment, but they are vulnerable when exposed across
a network. However, the alternative is to use some form of challenge-
response system. In a challenge-response system, there is still a
shared secret, but it is not transmitted over the network. Instead,
when the user attempts to connect to a remote computer, the remote
computer sends the user a randomly chosen number. The user must
demonstrate that he knows the shared secret by scrambling the random
number in accordance with the shared secret. If he does so correctly
and sends the scrambled number back to the remote computer, the remote
computer knows it is communicating with the real user and not some
imposter. This system is secure because the remote computer chooses a
different random number challenge each time. An eavesdropper cannot
determine the secret key from listening to the exchange, and he cannot
impersonate the user because the remote computer never chooses the
same challenge.
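A minimal sketch of such an exchange, using only the Python standard
library (the secret value and names are illustrative assumptions), might
look like:

    # Challenge-response: the shared secret itself never crosses the
    # network; only a scrambled version of a fresh random challenge does.
    import hmac, hashlib, secrets

    shared_secret = b"correct horse battery staple"   # known to both ends

    # Remote computer: choose a fresh random challenge for this attempt.
    challenge = secrets.token_bytes(16)

    # User's computer: scramble the challenge with the shared secret.
    response = hmac.new(shared_secret, challenge, hashlib.sha256).digest()

    # Remote computer: recompute and compare. An eavesdropper who records
    # this exchange cannot reuse it, because the next attempt will use a
    # different challenge.
    expected = hmac.new(shared_secret, challenge, hashlib.sha256).digest()
    print("accepted" if hmac.compare_digest(response, expected) else "rejected")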
One difficulty with this scheme is that the scrambling process requires
some computation. (It is also possible to use a preprinted list of
responses.) In the past, it has been commonplace to assume the user
has no computational capability because she may be using an ordinary
"dumb" terminal. To use this form of challenge response system, many
users now carry small devices that look like calculators to compute
the response when presented with a challenge. However, this is
somewhat burdensome and expensive, and is not used universally.
Fewer and fewer dumb terminals exist. Most users have a personal
computer or something equivalent on their end of the connection, so
the computation can be carried out within the personal computer. This
is particularly important for travelers and others using services
remotely. The growing popularity of laptop computers sets the stage
for widespread use of challenge-response security in remote access
protocols.
The other factor that has limited the deployment of cryptographic
protection is government policy, particularly the export regulations.
Cryptographic products have been classified as munitions and
controlled under the International Traffic in Arms Regulations (ITAR).
Without going into the complete history of the regulations, the effect
has been to squelch the inclusion of cryptographic protection in the
broad range of computer and network products. U.S. computer companies
compete in a world market. Since domestic sales typically account for
only half of total sales, products excluded from the world market are
generally not worth manufacturing. Furthermore, since a significant
portion of the total Internet traffic is between businesses,
universities and research institutions internationally, the design of
security systems, both on today's Internet and tomorrow's NII, needs
to be globally acceptable.
Over the past couple of years, there has been considerable attention
to the counter-intuitive impact of U.S. policy on the security of the
Internet. The intelligence community maintains that export controls
are essential, even though quite strong cryptographic software is
available freely from foreign companies and from foreign sites on the
Internet. The Commerce Department's nearly 20-year-old DES symmetric
algorithm is published in numerous widely used textbooks dealing with
computer security. In addition to freely available software
worldwide, hardware implementations are available around the world and
may be ordered for overnight delivery into the U.S. from abroad. Last
fall, the president of our company, Steve Walker, demonstrated the
efficacy of DES software obtained from multiple countries in testimony
before Rep. Gejdenson's subcommittee. He used DES software products
obtained from abroad to encrypt speech and music being transmitted
over a local area network using ordinary, unmodified workstations.
Export control of cryptography is part of a larger picture of government
policy related to cryptography in general. Three other initiatives are
now visible as part of the same picture.
NIST has been developing a government standard for digital
signatures. This is peculiar because industry is already using
the RSA algorithm for digital signatures, and there is no reason
for the government to do anything but adopt this robust and well
proven technology. However, NIST has proposed a different
algorithm which has no technical advantages and is clouded with
unresolved patent disputes.
The only explanation for NIST's strategy is that the RSA
algorithm is usable for encrypted communication as well as for
digital signatures. NIST's strategy, at the behest of the
intelligence community, is to attempt to split apart digital
signature technology from encryption technology, thereby
forestalling the wider use of encryption. NIST's attempt to
create a government standard with a different algorithm is an
expensive and diversionary course, whose sole justification and
effect can only be that it may delay the wider use of the RSA
algorithm.
The law enforcement community also maintains that widespread use
of cryptography would affect its ability to carry out its
mission. On that account, the government has proposed the use of
an escrowed key encryption system known as Clipper. Clipper
requires the use of special hardware. This initiative fails
several tests, including the cost of the hardware, strong
resistance from the U.S. community, and the virtual impossibility
of selling such systems to the foreign market, which accounts for
50% or more of the market.
The FBI has asked that legislation be introduced requiring that
all telecommunication facilities be easy to wiretap.
Taken together, these initiatives make it clear that the government
has a strong desire to be able to read the mail, both foreign and
domestic. This desire takes precedence over strengthening the
security of the information infrastructure. For example, there are no
initiatives directed towards assuring that products come "network
safe" so they are not vulnerable to attack when they are plugged into
networks. There are no initiatives which provide protection for
companies which attempt to deal forthrightly with possible flaws in
their products.
This set of priorities is unfortunate. It will retard the development
of the National Information Infrastructure, and it will diminish the
U.S. market share in the information industry, an area in which the
U.S. has established a clear competitive lead and dominant market position.
Fortunately, the effects are limited. The general requirement for
strong security is now widely understood in the network community.
Protocols which embody strong protection are being defined, developed
and documented. Implementations will come into existence around the
world. The export controls may retard these developments by a few
years, at most. Nonetheless, we can expect to see widespread
protection of network traffic, and correspondingly fewer penetrations
from both amateurs and professionals. The only interesting question
is whether the software which protects us will be made in the U.S.A.
or imported from abroad.
=======================================================================